Compare test case results with expected results

To compare the test case results with the expected results, run the test scripts for a rulebase. A report will then be generated showing the results.

What do you want to do?

Run a single test script

Run multiple test scripts

View the test results

Customize the test report

Save the test report

Run a single test script

To run a single test script:

  1. Ensure that you have created your test case(s) and outcome set.
  2. Click the Execute button on your test script tab. NOTE: This runs only the currently active test script.
    The test script will run and the Test Report will be displayed on a new tab.

Run multiple test scripts

To run multiple scripts for a rulebase:

  1. In Oracle Policy Modeling, select Reports | Run Multiple Test Scripts...
    The Run Multiple Test Scripts dialog box will open.
  2. Select the scripts in your rulebase that you would like to run. Click Run Test Scripts. The selected test scripts will run and the Test Report will be displayed.

 

NOTE: Re-run your test script(s) whenever the rulebase changes to verify that the results are still correct.

View the test results

After a test script has run, a tab showing the Test Report will open in the top right-hand pane of Oracle Policy Modeling.

An example of a Test Report for an individual test script is shown below. The report contains two sections: a summary and the test case comparison results. An additional Errors section is included if any errors are encountered while running the script.

Test cases that pass are highlighted in green and test cases that fail are highlighted in red.
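Conceptually, a test case passes when every expected outcome value matches the value the rulebase actually produced. The sketch below illustrates that comparison in Python; the attribute names and data structures are purely illustrative assumptions, not Oracle Policy Modeling's API or report format.

```python
# Illustrative sketch of outcome comparison; not Oracle Policy Modeling's API.
def compare_test_case(expected, actual):
    """Compare expected outcome values against actual values.

    Returns (passed, mismatches), where mismatches maps each
    differing attribute to its (expected, actual) pair.
    """
    mismatches = {
        attr: (expected[attr], actual.get(attr))
        for attr in expected
        if actual.get(attr) != expected[attr]
    }
    return (not mismatches, mismatches)

# Hypothetical test case: one attribute differs, so the case fails.
expected = {"eligible": True, "benefit_amount": 100}
actual = {"eligible": True, "benefit_amount": 80}
passed, diffs = compare_test_case(expected, actual)
```

A case like this would be highlighted in red in the report, with the mismatched attribute identifying why it failed.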


If you have selected multiple test scripts to run, the Test Report will open to a Test Script Result Summary. This shows the Total Statistics for all the test scripts at the top of the report; individual reports can be viewed by clicking the links below this.


To navigate from an individual report back to the summary view, click the Back button at the top left of the Test Report tab.

Customize the test report

Reports can be customized by changing the report options in File | Project Properties | Regression Tester Properties | Report Options.


The options in this dialog box are explained below:

Report type

The Test Report can be rendered in two layouts: sequential or tabular.

  • The sequential layout lists the results for each test case down the page.
  • The tabular layout presents the results in a grid, with test cases as rows and attributes as columns.

Alternatively, you can specify a custom XSLT template for the regression tester to use when generating the Test Report.

Report heading styles

There are three options for report headings:

  • Outcome ID only - reports display the attribute ID (either the model ID or the public name) as the heading.
  • Outcome display text only - reports display the Display Text specified for the attribute in the outcome set.
  • Both outcome ID and display text - reports display both the attribute ID and the Display Text.
Omit from report

You have the option to omit from the Test Report:

  • Values that match - this excludes attributes with outcome values that match the test case, and/or
  • Test cases that pass - this excludes test cases that have passed.

Save the test report

You can save a test report by clicking the Save button at the top right of the Test Report tab.

If the Test Report is for an individual test script, the report will be saved as an HTML file.

If the Test Report contains multiple test script reports, you can save the report in XML or HTML format. You need to specify a folder where the summary and individual report files will be saved.
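Saving in XML format makes the results easy to post-process, for example to collect pass/fail totals in a build pipeline. The sketch below parses such a file with Python's standard library; the element and attribute names are hypothetical, so inspect a report saved from your own project for the actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical summary structure; real saved reports may differ,
# so check an actual saved XML file for the true element names.
summary_xml = """
<testScriptResults>
  <script name="eligibility-tests" passed="12" failed="1"/>
  <script name="benefit-tests" passed="8" failed="0"/>
</testScriptResults>
"""

root = ET.fromstring(summary_xml)
# Build {script name: (passed count, failed count)} from the summary.
totals = {
    s.get("name"): (int(s.get("passed")), int(s.get("failed")))
    for s in root.findall("script")
}
```

A dictionary like this can then drive a pass/fail gate or a trend report across builds.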